Results 1 - 2 of 2

1.
International Journal of Information Management Data Insights; 3(1), 2023.
Article in English | Scopus | ID: covidwho-2234529

ABSTRACT

Social media has today become one of the most relevant and affordable platforms for expressing one's views in real time. The #EndSARS protest in Nigeria and the COVID-19 pandemic have shown how heavily both government agencies and individuals rely on social media. This research uses tweets collected via the Twitter API to identify and classify transportation disasters in Nigeria. Information such as the user, location, and time of each tweet makes identification and classification of transportation disasters possible in real time. Bidirectional Encoder Representations from Transformers (BERT) is built on the Transformer architecture, which comprises two separate mechanisms: an encoder that reads the text input and a decoder that produces a prediction for the task; BERT uses only the encoder and learns contextual relations between words (and sub-words) in a text. This research fine-tunes BERT with the AdamW optimizer. AdamW extends stochastic gradient descent by computing an adaptive learning rate for each parameter (as in Adam) and applying decoupled weight decay. Our proposed model achieves an accuracy of 82%, outperforming the existing baseline BERT model, which achieved 64%.
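
The pipeline described in the abstract (a BERT encoder fine-tuned with the AdamW optimizer for tweet classification) can be sketched roughly as below using PyTorch and the Hugging Face transformers library. This is a minimal illustration under stated assumptions, not the authors' implementation: the checkpoint (bert-base-uncased), the binary label set, the learning rate, the weight decay, and the example tweet and label are all hypothetical.

```python
# Minimal sketch (not the authors' code): fine-tuning a BERT encoder with the
# AdamW optimizer for tweet classification. Checkpoint, label count,
# hyperparameters, and example data are assumptions for illustration only.
import torch
from torch.optim import AdamW
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,  # hypothetical labels: transportation disaster vs. not
)

# AdamW: adaptive per-parameter learning rates (as in Adam) plus decoupled weight decay.
optimizer = AdamW(model.parameters(), lr=2e-5, weight_decay=0.01)

# Hypothetical single training example (tweet text and label).
tweets = ["Multiple vehicles involved in a crash on the Lagos-Ibadan expressway"]
labels = torch.tensor([1])

model.train()
batch = tokenizer(tweets, padding=True, truncation=True, return_tensors="pt")
outputs = model(**batch, labels=labels)  # BERT encoder -> classification head -> loss
outputs.loss.backward()                  # one standard fine-tuning step
optimizer.step()
optimizer.zero_grad()
```

In practice this step would run over mini-batches of labelled tweets for several epochs, with accuracy reported on a held-out split, as in the 82% figure quoted above.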

2.
International Journal of Information Management Data Insights; 3(1):100154, 2023.
Article in English | ScienceDirect | ID: covidwho-2180698

